Lagrange method - translation into German
Diclib.com

Word translation and analysis by the ChatGPT artificial intelligence

On this page you can get a detailed analysis of a word or phrase, produced with the best artificial-intelligence technology available today:

  • how the word is used
  • frequency of use
  • whether it is used more often in spoken or written speech
  • translation options
  • usage examples (several phrases with translation)
  • etymology


A METHOD TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS
Lagrange Multiplier; Lagrangian multiplier; Lagrangian Multiplier; Lagrangian Function; Lagrangian multipliers; Lagrange multiplier method; LaGrange multiplier; Lagrangian multiplicator; Lagrange's method; Lagrange's undetermined multiplier; Lagrangian function; Lagrange function; Method of Lagrange multipliers; Method of Lagrange Multipliers; Lagrange multiplier principle; Lagrange multipliers; Lagrangian minimization; Lagrange multipliers method; Lagrangian expression

Lagrange method
Lagrangemethode, (Mathematik) Methode zum Auffinden von Maxima und Minima einer Funktion mehrerer Variablen unter Nebenbedingungen
teaching method         
GROUP OF METHODS AND PRINCIPLES USED TO TEACH
Teaching methods; Teaching technique; Training method
Lehrmethode, Unterrichtsmethode (Art und Weise, Wissen zu vermitteln)
scientific method         
MATHEMATICAL AND EXPERIMENTAL TECHNIQUES EMPLOYED IN THE NATURAL SCIENCES; MORE SPECIFICALLY, TECHNIQUES USED IN THE CONSTRUCTION AND TESTING OF SCIENTIFIC HYPOTHESES
Scientific Method; Unscientific; The scientific method; Interpretations of the scientific method; Scientific thinking; Scientific reseach; Scientific research; Scientific process; Scientifically proven; Scientific analysis; Critical method; Process (science); Process(science); Summary of the Steps in the Scientific Method; Scientific Research; Scientific Investigation; Scientific methodology; Scientific researcher; Scientific studies; Scientific conflict; The Method of Tenacity; Method of Tenacity; Scientific methods; Steps in the research process; Scientific claim; Methodology of science; Scientific burden of proof; Systematic investigation; Research cycle; Scientific validation; Science method; Science methods; Scientific enquiry; Scientific approach; Scientific principles; Scientific operation; Experimental confirmation
wissenschaftliche Methode

Definition

class method
<programming> A kind of method, available in some object-oriented programming languages, that operates on the class as a whole, as opposed to an object method that operates on an object that is an instance of the class. A typical example of a class method would be one that keeps a count of the number of objects of that class that have been created. (2000-03-22)
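The "typical example" mentioned in the definition above can be sketched in Python, where class methods are declared with the @classmethod decorator (the class name Counter is hypothetical, chosen for this illustration):

```python
class Counter:
    """A class that keeps a count of how many of its objects have been created."""
    _created = 0  # class-level attribute, shared by all instances

    def __init__(self):
        type(self)._created += 1  # update the class-wide count

    @classmethod
    def created(cls):
        # Operates on the class as a whole, not on a particular instance.
        return cls._created

a = Counter()
b = Counter()
print(Counter.created())  # → 2
```

Note that created() is called on the class itself, not on a or b, which is exactly the distinction from an object (instance) method drawn in the definition.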

Wikipedia

Lagrange multiplier

In mathematical optimization, the method of Lagrange multipliers is a strategy for finding the local maxima and minima of a function subject to equality constraints (i.e., subject to the condition that one or more equations have to be satisfied exactly by the chosen values of the variables). It is named after the mathematician Joseph-Louis Lagrange. The basic idea is to convert a constrained problem into a form such that the derivative test of an unconstrained problem can still be applied. The relationship between the gradient of the function and gradients of the constraints rather naturally leads to a reformulation of the original problem, known as the Lagrangian function.

The method can be summarized as follows: in order to find the maximum or minimum of a function f(x) subject to the equality constraint g(x) = 0, form the Lagrangian function,

ℒ(x, λ) ≡ f(x) + λ·g(x),

and find the stationary points of ℒ considered as a function of x and the Lagrange multiplier λ. This means that all partial derivatives should be zero, including the partial derivative with respect to λ:

∂ℒ/∂x = 0 and ∂ℒ/∂λ = 0;

or equivalently

∂f(x)/∂x + λ·∂g(x)/∂x = 0 and g(x) = 0.
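As a concrete check of these stationarity conditions, consider a small worked example (the functions f and g below are chosen for illustration and do not come from the text): maximize f(x, y) = x + y on the unit circle g(x, y) = x² + y² − 1 = 0. Stationarity of ℒ = f + λ·g gives 1 + 2λx = 0, 1 + 2λy = 0, and x² + y² = 1, whose solution for the maximum is x = y = 1/√2 with λ = −1/√2:

```python
import math

# Candidate stationary point from solving the system by hand:
#   ∂L/∂x = 1 + 2λx = 0,  ∂L/∂y = 1 + 2λy = 0,  ∂L/∂λ = x² + y² − 1 = 0
x = y = 1 / math.sqrt(2)
lam = -1 / math.sqrt(2)

# Verify that every partial derivative of the Lagrangian vanishes:
assert abs(1 + 2 * lam * x) < 1e-12      # ∂L/∂x
assert abs(1 + 2 * lam * y) < 1e-12      # ∂L/∂y
assert abs(x**2 + y**2 - 1) < 1e-12      # ∂L/∂λ, i.e. the constraint g = 0

print(x + y)  # constrained maximum of f, equal to √2 ≈ 1.4142
```

Note that the multiplier λ is solved for alongside x and y; it measures how strongly the constraint binds at the optimum.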

The solution corresponding to the original constrained optimization is always a saddle point of the Lagrangian function, which can be identified among the stationary points from the definiteness of the bordered Hessian matrix.
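For a single constraint, the bordered Hessian referred to here takes the standard textbook form (this definition is supplied for context and is not spelled out in the text):

```latex
H(\mathcal{L}) =
\begin{pmatrix}
0 & \nabla g(x)^{\mathsf T} \\
\nabla g(x) & \nabla^2_{xx}\, \mathcal{L}(x, \lambda)
\end{pmatrix}
```

The signs of the leading principal minors of this matrix, evaluated at a stationary point, classify it as a constrained maximum or minimum.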

The great advantage of this method is that it allows the optimization to be solved without explicit parameterization in terms of the constraints. As a result, the method of Lagrange multipliers is widely used to solve challenging constrained optimization problems. Further, the method of Lagrange multipliers is generalized by the Karush–Kuhn–Tucker conditions, which can also take into account inequality constraints of the form h(x) ≤ c for a given constant c.